V-P cache: a storage efficient virtual cache organization

Authors

  • Sang Lyul Min
  • Jesung Kim
  • Chong-Sang Kim
  • Heonshik Shin
  • Deog-Kyoon Jeong
Abstract

Introduction

In recent years, processor speeds have improved dramatically. This drastic increase in processor speeds makes a cache with a very fast hit time a must for high-performance computer systems. These trends make direct-mapped virtual caches very attractive, since they inherit the hit-time advantage of both types of cache (i.e., direct-mapped and virtual). However, they also inherit the disadvantages of both. Being direct-mapped, they have lower hit ratios than set-associative caches of comparable size. Moreover, being virtual caches, they have to wrestle with the well-known synonym problem [1], which results from more than one copy of the same data residing in the cache under different virtual addresses. The synonym problem does not occur in physical caches, since each block in a physical cache is tagged and accessed by its physical address, which is unique for every memory block. To prevent the synonym problem, a scheme, called an anti-aliasing scheme, is required in a virtual cache to guarantee that every block in the cache has a unique physical address. An anti-aliasing scheme can be implemented either in hardware [2-4] or in software [5-7].

This paper proposes a novel hardware-based anti-aliasing technique that improves not only the miss ratio but also the storage utilisation of direct-mapped virtual caches. The key to the proposed scheme is the incorporation of a secondary mapping function to select the set. This secondary mapping function is applied only when the cache access based on the primary set-selection function, which is usually a bit-selection function from the virtual address, is a miss. The secondary mapping function we use is a bit-selection function from the physical address. Therefore, in our scheme, a given memory block can be placed in two different sets in the cache, one based on the virtual address and the other based on the physical address.

One benefit of such remapping is that it can eliminate many misses due to conflicts among frequently used blocks that happen to be mapped to the same set by the primary mapping function. Another important benefit of the remapping is that it reduces the so-called anti-aliasing misses, which result from accesses to blocks previously evicted from the cache for anti-aliasing purposes. A quantitative evaluation based on trace-driven simulations using ATUM [8] traces shows that the proposed scheme yields miss ratio improvements of …
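To make the two-level set selection concrete, the sketch below models a direct-mapped virtual cache whose primary index is a bit selection from the virtual address and whose secondary index is a bit selection from the physical address. This is a minimal illustration, not the authors' hardware design: the 32-byte blocks, the 128 sets, the VPCacheSketch class, the miss-time placement policy, and the location dictionary (a software stand-in for whatever bookkeeping the hardware uses to keep at most one copy of each physical block) are all assumptions made for the example.

# Minimal Python sketch of the two set-selection functions described above.
# Parameters and names are illustrative assumptions, not the paper's design.

BLOCK_BITS = 5                      # assumed 32-byte cache blocks
SET_BITS = 7                        # assumed 128 direct-mapped sets
SET_MASK = (1 << SET_BITS) - 1


class Line:
    """One cache line; real hardware would store only the physical tag bits,
    but the sketch keeps the whole physical block number for simplicity."""
    __slots__ = ("valid", "pblock")

    def __init__(self):
        self.valid = False
        self.pblock = None


class VPCacheSketch:
    def __init__(self):
        self.sets = [Line() for _ in range(1 << SET_BITS)]
        # Software stand-in for hardware anti-aliasing bookkeeping:
        # physical block number -> set index where that block currently lives.
        self.location = {}

    def access(self, vaddr, paddr):
        """Return 'hit-primary', 'hit-secondary', or 'miss'."""
        block = paddr >> BLOCK_BITS              # physical block number
        vi = (vaddr >> BLOCK_BITS) & SET_MASK    # primary: virtual bit selection
        pi = block & SET_MASK                    # secondary: physical bit selection

        if self.sets[vi].valid and self.sets[vi].pblock == block:
            return "hit-primary"                 # found via the virtual index
        if self.sets[pi].valid and self.sets[pi].pblock == block:
            return "hit-secondary"               # retried via the physical index

        # Miss.  If a synonym left a copy of this block in some other set,
        # invalidate it so that every cached block keeps a unique physical
        # address (the anti-aliasing guarantee).
        old = self.location.get(block)
        if old is not None and self.sets[old].valid and self.sets[old].pblock == block:
            self.sets[old].valid = False

        # Place the block: prefer the virtually indexed set, but fall back to
        # the physically indexed set if the former is occupied and the latter
        # is free (placement/replacement policy simplified for illustration).
        if self.sets[vi].valid and not self.sets[pi].valid:
            dest = pi
        else:
            dest = vi
        victim = self.sets[dest]
        if victim.valid:
            self.location.pop(victim.pblock, None)
        victim.valid, victim.pblock = True, block
        self.location[block] = dest
        return "miss"


if __name__ == "__main__":
    cache = VPCacheSketch()
    # Two blocks that collide in the virtually indexed set (both map to set 0)
    # but have different physical indices (sets 0 and 1), so the secondary
    # mapping lets them coexist instead of conflicting.
    print(cache.access(0x00000, 0x10000))   # miss (cold)
    print(cache.access(0x20000, 0x30020))   # miss, placed via the physical index
    print(cache.access(0x00000, 0x10000))   # hit-primary
    print(cache.access(0x20000, 0x30020))   # hit-secondary

Run as a script, the four accesses above print miss, miss, hit-primary, hit-secondary; without the secondary mapping, the third and fourth accesses would keep evicting each other from set 0.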

Similar articles

A Space-Efficient Virtual Memory Organization for On-Chip Compressed Caches

On-chip compressed cache systems have recently been developed that reduce the cache miss count and off-chip memory traffic by storing and transferring cache lines in compressed form. In order to further expand the main memory capacity, in this paper we present a space-efficient virtual memory organization technique for on-chip compressed caches. Simulation results show that the proposed or...


Reduction in Cache Memory Power Consumption based on Replacement Quantity

Today, power consumption is considered one of the important issues, so its reduction plays a considerable role in system design. Previous studies have shown that approximately 50% of total power consumption is used in cache memories. There is a direct relationship between power consumption and the number of replacements made in the cache. The less the number of replacements is, the less...


Understanding the Impact of Cache Locations on Storage Performance and Energy Consumption of Virtualization Systems

As per-server CPU cores and DRAM capacity continuously increase, the application density of virtualization platforms rises. High application density imposes tremendous pressure on storage systems. Layers of caches are deployed to improve storage performance. Owing to its manageability and transparency advantages, hypervisor-side caching is widely employed. However, hypervisor-side caches locate at th...



Journal:
  • Microprocessors and Microsystems - Embedded Hardware Design

Volume 17, Issue 

Pages -

Publication date: 1993